DeiT Tiny Distilled Patch16 224
Apache-2.0
This model is a distilled version of the Data-efficient image Transformer (DeiT), pre-trained and fine-tuned on ImageNet-1k at 224x224 resolution. It learns efficiently from a teacher model through distillation, in addition to standard supervised training.
Image Classification
Transformers
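As a quick sketch of inference with the Transformers library, the snippet below classifies a single image. The checkpoint id `facebook/deit-tiny-distilled-patch16-224` and the example image URL are assumptions, not taken from this card; substitute your own checkpoint and input as needed.

```python
# Minimal inference sketch (assumed checkpoint id and example image).
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, DeiTForImageClassificationWithTeacher

# Example image (a commonly used COCO validation photo); replace with your own.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Load the preprocessing pipeline and the distilled model.
processor = AutoImageProcessor.from_pretrained("facebook/deit-tiny-distilled-patch16-224")
model = DeiTForImageClassificationWithTeacher.from_pretrained("facebook/deit-tiny-distilled-patch16-224")

# Resize/normalize to 224x224 and run a forward pass.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# `logits` averages the class-token and distillation-token heads.
predicted_class = outputs.logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```

The `...WithTeacher` class exposes both the class-token head and the distillation-token head; their averaged logits are typically used for prediction at inference time.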